Network Resource Allocation via Stochastic Subgradient Descent: Convergence Rate

Authors
Abstract


Related articles

Constrained consumable resource allocation in alternative stochastic networks via multi-objective decision making

Many real-world projects are completed through the realization of one, and only one, of several possible network paths. Such networks are called alternative stochastic networks (ASNs). It is assumed that the nodes of the considered network are probabilistic, with exclusive-or receivers and exclusive-or emitters. First, an analytical approach is proposed to simplify the structure of t...


On the convergence rate of ordinal optimization for a class of stochastic discrete resource allocation problems

In [1], stochastic discrete resource allocation problems were considered; these are hard because of the combinatorial explosion of the feasible allocation search space and the absence of closed-form expressions for the cost function of interest. An ordinal optimization algorithm for solving a class of such problems was then shown to converge in probability to the global optimum. In this pape...
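
To illustrate the ordinal-optimization idea this abstract refers to, the sketch below selects the best observed candidate allocation from a small number of noisy cost evaluations, relying on ordinal comparison of candidates rather than precise cost estimates. The candidate set, the stand-in cost, and the noise model are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def ordinal_select(allocations, noisy_cost, n_samples=10):
    """Ordinal-optimization-style selection over a finite allocation set.

    Estimate each candidate's cost from a few noisy samples and keep the
    best *observed* one; the correct ordering of candidates is typically
    identified long before the cost estimates themselves converge.
    """
    estimates = [np.mean([noisy_cost(a) for _ in range(n_samples)])
                 for a in allocations]
    return allocations[int(np.argmin(estimates))]

# Hypothetical example: allocate 5 identical servers among 3 queues.
candidates = [(5, 0, 0), (3, 1, 1), (2, 2, 1), (1, 2, 2)]
true_cost = lambda a: sum(1.0 / (x + 1) for x in a)           # stand-in cost
noisy_cost = lambda a: true_cost(a) + rng.normal(scale=0.2)   # noisy evaluation
best = ordinal_select(candidates, noisy_cost)
```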


Efficient and Practical Stochastic Subgradient Descent for Nuclear Norm Regularization

We describe novel subgradient methods for a broad class of matrix optimization problems involving nuclear norm regularization. Unlike existing approaches, our method executes very cheap iterations by combining low-rank stochastic subgradients with efficient incremental SVD updates, made possible by highly optimized and parallelizable dense linear algebra operations on small matrices. Our practi...
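
The update that the paper makes cheap can be written as a plain stochastic subgradient step on a nuclear-norm-regularized objective. The sketch below shows only that unoptimized dense step, not the paper's low-rank/incremental-SVD implementation; `data_grad`, `lam`, and `step` are placeholder inputs.

```python
import numpy as np

def nuclear_norm_subgrad_step(X, data_grad, lam, step):
    """One stochastic subgradient step on  f(X) + lam * ||X||_*.

    Plain dense sketch: U @ Vt from a thin SVD of X is a valid subgradient
    of the nuclear norm, and data_grad is a (possibly stochastic)
    subgradient of the data-fitting term f at X. The paper's method avoids
    this full SVD by combining low-rank stochastic subgradients with
    incremental SVD updates.
    """
    U, _, Vt = np.linalg.svd(X, full_matrices=False)
    G = data_grad + lam * (U @ Vt)   # subgradient of the regularized objective
    return X - step * G
```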


On Stochastic Subgradient Mirror-Descent Algorithm with Weighted Averaging

This paper considers a stochastic subgradient mirror-descent method for solving constrained convex minimization problems. In particular, a stochastic subgradient mirror-descent method with weighted iterate averaging is investigated and its per-iterate convergence rate is analyzed. The novel part of the approach is the choice of the weights used to construct the averages. Through the use o...
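
A minimal sketch of this family of methods is given below, assuming the Euclidean mirror map (so the mirror step reduces to a projected subgradient step) and averaging weights proportional to the diminishing step sizes, which is one natural weighting scheme. The `stoch_subgrad` oracle and the simplex-constrained example are illustrative placeholders, not the paper's setting.

```python
import numpy as np

def weighted_avg_subgradient(stoch_subgrad, project, x0, n_iters=1000, c=1.0):
    """Projected stochastic subgradient method with weighted iterate averaging.

    Euclidean-mirror-map sketch: step sizes c/sqrt(k+1), averaging weights
    proportional to the step sizes, running weighted average returned.
    """
    x = np.asarray(x0, dtype=float)
    x_avg = np.zeros_like(x)
    weight_sum = 0.0
    for k in range(n_iters):
        g = stoch_subgrad(x)               # noisy subgradient at x
        step = c / np.sqrt(k + 1)          # diminishing step size
        x = project(x - step * g)          # projection step (Euclidean mirror map)
        weight_sum += step
        x_avg += (step / weight_sum) * (x - x_avg)   # running weighted average
    return x_avg

# Hypothetical usage: minimize E|a^T x - b| over the probability simplex.
rng = np.random.default_rng(0)

def stoch_subgrad(x):
    a = rng.normal(size=x.size)
    b = 0.1 * rng.normal()
    return np.sign(a @ x - b) * a

def project_simplex(v):
    # Euclidean projection onto the probability simplex.
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, v.size + 1) > (css - 1))[0][-1]
    theta = (css[rho] - 1) / (rho + 1)
    return np.maximum(v - theta, 0.0)

x_star = weighted_avg_subgradient(stoch_subgrad, project_simplex, np.ones(5) / 5)
```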


Radial Subgradient Descent

We present a subgradient method for minimizing non-smooth, non-Lipschitz convex optimization problems. The only structure assumed is that a strictly feasible point is known. We extend the work of Renegar [1] by taking a different perspective, leading to an algorithm which is conceptually more natural, has notably improved convergence rates, and for which the analysis is surprisingly simple. At ...



Journal

Journal title: IEEE Transactions on Communications

Year: 2018

ISSN: 0090-6778

DOI: 10.1109/tcomm.2018.2792430